Search for: All records
Creators/Authors contains: "Zhong, Weishun"


  1. Quantum generative models hold the promise of accelerating or improving machine learning tasks by leveraging the probabilistic nature of quantum states, but the successful optimization of these models remains a difficult challenge. To tackle this challenge, we present a new architecture for quantum generative modeling that combines insights from classical machine learning and quantum phases of matter. In particular, our model utilizes both many-body localized (MBL) dynamics and hidden units to improve the optimization of the model. We demonstrate the applicability of our model on a diverse set of classical and quantum tasks, including a toy version of MNIST handwritten digits, quantum data obtained from quantum many-body states, and nonlocal parity data. Our architecture and algorithm provide novel strategies for utilizing quantum many-body systems as learning resources and reveal a powerful connection between disorder, interaction, and learning in quantum many-body systems. Published by the American Physical Society, 2024.
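
A rough sketch of the sampling side of this idea, assuming a small disordered spin chain simulated in NumPy rather than the authors' actual architecture: the chain is evolved under a Hamiltonian with strong random fields, designated hidden qubits are marginalised out, and bit-strings are sampled from the remaining visible qubits. Every parameter below (qubit counts, disorder strength W, coupling J, evolution time t) is an illustrative assumption; this toy reproduces only the sampling pipeline, not the training loop.

```python
# Minimal sketch (not the authors' architecture): generative sampling from a
# small disordered spin chain, with "hidden" qubits marginalised out and
# bit-strings sampled from the remaining "visible" qubits. All parameters
# (qubit counts, W, J, t) are illustrative assumptions.
import numpy as np

def site_op(op, i, n):
    """Embed a single-qubit operator at site i of an n-qubit chain."""
    out = np.eye(1, dtype=complex)
    for j in range(n):
        out = np.kron(out, op if j == i else np.eye(2, dtype=complex))
    return out

def disordered_hamiltonian(n, J=1.0, W=8.0, seed=0):
    """Nearest-neighbour ZZ couplings plus strong random z-fields (the
    regime associated with MBL in interacting chains), plus a weak
    transverse field so the dynamics are nontrivial."""
    rng = np.random.default_rng(seed)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    H = sum(J * site_op(sz, i, n) @ site_op(sz, i + 1, n) for i in range(n - 1))
    H += sum(rng.uniform(-W, W) * site_op(sz, i, n) for i in range(n))
    H += sum(0.5 * site_op(sx, i, n) for i in range(n))
    return H

def sample_visible(n_visible=3, n_hidden=2, t=5.0, n_samples=5):
    """Evolve |0...0> for time t, marginalise over the hidden qubits,
    and sample bit-strings on the visible qubits."""
    n = n_visible + n_hidden
    evals, evecs = np.linalg.eigh(disordered_hamiltonian(n))
    psi0 = np.zeros(2 ** n, dtype=complex)
    psi0[0] = 1.0
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    probs = np.abs(psi_t) ** 2
    # Visible qubits are the leading tensor factors, so grouping the basis
    # index by its high bits marginalises the hidden qubits.
    probs = probs.reshape(2 ** n_visible, 2 ** n_hidden).sum(axis=1)
    probs /= probs.sum()
    idx = np.random.default_rng(1).choice(2 ** n_visible, size=n_samples, p=probs)
    return [format(i, f"0{n_visible}b") for i in idx]

print(sample_visible())
```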
  2. Continuous attractors have been used to understand recent neuroscience experiments in which persistent activity patterns encode internal representations of external attributes such as head direction or spatial location. However, the conditions under which the emergent bump of neural activity in such networks can be manipulated by space- and time-dependent external sensory or motor signals are not understood. Here, we find fundamental limits on how rapidly internal representations encoded along continuous attractors can be updated by an external signal. We apply these results to place-cell networks to derive a velocity-dependent nonequilibrium memory capacity in neural networks.
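
To give a concrete feel for the kind of speed limit at issue, here is a generic ring-attractor toy model, not the paper's derivation: a rotating external cue drags the activity bump, and the steady-state angular lag between bump and cue grows with the drive speed, so a sufficiently fast drive is no longer tracked faithfully. The connectivity, time constant, and cue parameters are all assumptions.

```python
# Minimal ring-attractor toy model (a generic illustration, not the paper's
# derivation): an activity bump is dragged by a rotating external cue, and
# the steady-state lag between bump and cue grows with the drive speed.
# All parameters (N, tau, connectivity, cue amplitude, speeds) are assumptions.
import numpy as np

N, tau, dt = 128, 10.0, 0.5                      # neurons, time const (ms), step (ms)
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
# Cosine recurrent connectivity on the ring: uniform inhibition + local excitation.
W = (-0.5 + 1.5 * np.cos(theta[:, None] - theta[None, :])) / N

def steady_lag(speed, T=2000.0, cue_amp=0.4):
    """Drive the bump with a cue rotating at `speed` rad/ms for time T;
    return the final angular lag of the bump behind the cue."""
    r = np.exp(np.cos(theta))                    # seed a bump at angle 0
    r /= r.max()
    cue_pos = 0.0
    for _ in range(int(T / dt)):
        cue_pos += speed * dt
        drive = W @ r + cue_amp * np.cos(theta - cue_pos)
        r += (dt / tau) * (-r + np.maximum(drive, 0.0))  # rectified rate dynamics
    bump_pos = np.angle(np.sum(r * np.exp(1j * theta)))  # population-vector angle
    return np.angle(np.exp(1j * (cue_pos - bump_pos)))   # wrapped lag in (-pi, pi]

for v in (0.0005, 0.002, 0.01):                  # cue angular speeds, rad/ms
    print(f"speed {v:.4f} rad/ms -> steady lag {steady_lag(v):+.3f} rad")
```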